Solvency Markov Decision Processes with Interest

Authors

  • Tomáš Brázdil
  • Taolue Chen
  • Vojtěch Forejt
  • Petr Novotný
  • Aistis Simaitis
Abstract

Solvency games, introduced by Berger et al., provide an abstract framework for modelling the decisions of a risk-averse investor whose goal is to avoid ever going broke. We study a new variant of this model in which, in addition to the stochastic environment and fixed increments and decrements to the investor's wealth, we introduce interest, which is earned on the current level of savings or paid on the current level of debt. We study problems related to the minimum initial wealth sufficient to avoid bankruptcy (i.e. a steady decrease of the wealth) with probability at least p. We present an exponential-time algorithm which approximates this minimum initial wealth, and show that a polynomial-time approximation is not possible unless P = NP. For the qualitative case, i.e. p = 1, we show that the problem of whether a given number is larger than or equal to the minimum initial wealth belongs to NP ∩ coNP, and show that a polynomial-time algorithm would yield a polynomial-time algorithm for mean-payoff games, the existence of which is a longstanding open problem. We also identify some classes of solvency MDPs for which this problem is in P. In all of the above cases the algorithms also yield corresponding bankruptcy-avoiding strategies.

1998 ACM Subject Classification: G.3 Probability and statistics.
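The wealth dynamics sketched in the abstract can be illustrated with a small simulation. This is a minimal sketch, not the paper's formal model: the per-step update w ← (1+r)·w + reward, the separate savings and debt rates, the ruin floor, and all numeric values are illustrative assumptions, and the Monte Carlo estimate is only a crude proxy for the bankruptcy probability the paper analyses exactly.

```python
import random

def step(wealth, reward, save_rate=0.02, debt_rate=0.05):
    # Hypothetical wealth update: interest is earned on savings
    # (wealth >= 0) or paid on debt (wealth < 0), then the action's
    # stochastic reward is added. Rates are illustrative assumptions.
    rate = save_rate if wealth >= 0 else debt_rate
    return wealth * (1 + rate) + reward

def ruin_frequency(w0, rewards, horizon=200, trials=2000, floor=-100.0, seed=0):
    # Crude Monte Carlo proxy for the bankruptcy probability from initial
    # wealth w0: the fraction of trajectories (rewards drawn uniformly
    # from `rewards`) whose wealth ever falls below `floor`.
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        w = w0
        for _ in range(horizon):
            w = step(w, rng.choice(rewards))
            if w < floor:
                ruined += 1
                break
    return ruined / trials
```

Raising the initial wealth w0 drives the estimated ruin frequency down, which mirrors the paper's central quantity: the minimum initial wealth sufficient to avoid bankruptcy with probability at least p.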

Similar articles

Accelerated decomposition techniques for large discounted Markov decision processes

Many hierarchical techniques for solving large Markov decision processes (MDPs) are based on partitioning the state space into strongly connected components (SCCs), which can be grouped into levels. In each level, smaller problems called restricted MDPs are solved, and these partial solutions are then combined to obtain the global solution. In this paper, we first propose a novel algorith...

Full text
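The decomposition step this entry describes can be sketched generically: compute the SCCs of the transition graph and order them so that partial solutions can be combined level by level. The sketch below is a standard Kosaraju-style SCC computation, not the authors' algorithm; the dictionary graph encoding and all names are illustrative, and the per-component MDP solver is omitted.

```python
from collections import defaultdict

def sccs(graph):
    # Strongly connected components of a directed graph given as
    # {node: [successors]}, via Kosaraju's algorithm. Components are
    # returned in topological order of the condensation (sources first).
    nodes = set(graph) | {v for vs in graph.values() for v in vs}

    # Pass 1: record nodes by DFS finish time on the original graph.
    seen, order = set(), []
    def finish(u):
        seen.add(u)
        for v in graph.get(u, ()):
            if v not in seen:
                finish(v)
        order.append(u)
    for u in nodes:
        if u not in seen:
            finish(u)

    # Pass 2: DFS on the reversed graph in reverse finish order;
    # each tree found is one strongly connected component.
    rev = defaultdict(list)
    for u, vs in graph.items():
        for v in vs:
            rev[v].append(u)
    comp, comps = {}, []
    def collect(u):
        comp[u] = len(comps) - 1
        comps[-1].append(u)
        for v in rev[u]:
            if v not in comp:
                collect(v)
    for u in reversed(order):
        if u not in comp:
            comps.append([])
            collect(u)
    return comps
```

A hierarchical solver would then sweep the components in reverse (sink components first), solve each restricted MDP, and substitute the resulting values when solving the components that reach it.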

Sustainability of Current Account Deficits in Turkey: Markov Switching Approach

Countries may face debt problems even in periods when the long-run solvency condition on current account deficits holds. Using a Markov switching model, the econometric methodology proposed in this study allows us to distinguish periods that are associated with unsustainable outcomes from those in which the solvency condition holds. Analyzing Turkey's current account deficits between 1987:4 and 20...

Full text

Estimating insurers' capital requirements through Markov switching models in the Solvency II framework

Solvency II will transform the system for determining capital requirements for insurers. The new regulatory framework proposes a standard model, but at the same time, it encourages the use of internal models of self-evaluation and risk management. This paper attempts to assess the adequacy of Markov switching models for the design of internal models of insurers' equity risk exposure. We have us...

Full text

Modeling Markov Decision Processes with Imprecise Probabilities Using Probabilistic Logic Programming

We study languages that specify Markov Decision Processes with Imprecise Probabilities (MDPIPs) by mixing probabilities and logic programming. We propose a novel language that can capture MDPIPs and Markov Decision Processes with Set-valued Transitions (MDPSTs); we then obtain the complexity of one-step inference for the resulting MDPIPs and MDPSTs. We also present results of independent intere...

Full text

A Formalism for Stochastic Decision Processes with Asynchronous Events

We present the generalized semi-Markov decision process (GSMDP) as a natural model for stochastic decision processes with asynchronous events, in the hope of spurring interest in asynchronous models, which are often overlooked in the AI literature.

Full text

Publication date: 2013